
    Enhancing the Accuracy of Synthetic File System Benchmarks

    File system benchmarking plays an essential part in assessing a file system's performance. Measuring and studying file system performance is especially difficult because it involves several layers of hardware and software. Furthermore, different systems have different workload characteristics, so a file system optimized for one workload may not perform optimally under others. It is therefore imperative that the file system under study be examined with a workload equivalent to its production workload, to ensure that it is optimized for its actual usage. The most widely used benchmarking method is synthetic benchmarking, owing to its ease of use and flexibility. That flexibility allows system designers to produce a variety of workloads that provide insight into how the file system will perform under slightly different conditions. The downside of synthetic workloads is that they are generic and do not share the characteristics of production workloads. For instance, synthetic benchmarks do not account for the effects of the cache, which can greatly impact the performance of the underlying file system, nor do they model the variation within a given workload. This can lead to file systems that are not optimally designed for their usage. This work enhanced synthetic workload generation methods by taking into consideration how file system operations are satisfied by lower-level function calls, and by modeling variations in the workload's footprint when present. The first step in the methodology was to run a given workload and trace it with a tool called tracefs. The collected traces contained data on the file system operations and the lower-level function calls that satisfied these operations. The trace was then divided into chunks small enough that the workload characteristics of each chunk could be considered uniform.
    A configuration file modeling each chunk was then generated and supplied to FileRunner, a synthetic workload generator tool created by this work. The workload definition for each chunk allowed FileRunner to generate a synthetic workload that reproduced the same workload footprint as the corresponding segment in the original workload; in other words, the synthetic workload exercised the lower-level function calls in the same way as the original workload. Furthermore, FileRunner generated a synthetic workload for each segment in the order in which the segments appeared in the trace, resulting in a final workload that mimicked the variation present in the original workload. The results indicated that the methodology can create a workload whose throughput is within 10% of the original and whose operation latencies, with the exception of create latencies, fall within the allowable 10% difference, and in some cases within the 15% maximum allowable difference. The work was able to accurately model the I/O footprint: in some cases the difference was negligible, and in the worst case it was 2.49%.
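The chunk-and-profile step described above can be sketched as follows. This is a minimal illustration, not FileRunner's actual implementation: the trace record fields and the per-chunk workload definition (operation mix, mean request size) are hypothetical stand-ins for the real trace format and configuration file.

```python
from collections import Counter

def chunk_trace(records, chunk_size):
    """Split a trace (a list of (operation, request_size) records) into
    fixed-size chunks, assumed small enough to be internally uniform."""
    for i in range(0, len(records), chunk_size):
        yield records[i:i + chunk_size]

def chunk_profile(chunk):
    """Summarize one chunk as a workload definition: the operation mix
    and mean request size (a hypothetical stand-in for a config file)."""
    ops = Counter(op for op, _ in chunk)
    total = sum(ops.values())
    mean_size = sum(size for _, size in chunk) / len(chunk)
    return {
        "op_mix": {op: n / total for op, n in ops.items()},
        "mean_size": mean_size,
    }

# Tiny illustrative trace: each profile would drive one synthetic segment,
# emitted in trace order to preserve the workload's variation.
trace = [("read", 4096), ("read", 4096), ("write", 8192), ("read", 4096)]
profiles = [chunk_profile(c) for c in chunk_trace(trace, 2)]
```

Generating one synthetic segment per profile, in order, is what lets the final workload reproduce the original's variation rather than a single averaged behavior.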

    Postmortem computed tomography for diagnosis of cause of death in male prisoners

    Objective: To determine the utility of postmortem CT (PMCT) examination in establishing the cause of death among male prisoners dying in Karachi jails. Methods: A descriptive study was carried out from February 2006 to September 2007 at the CT scan section, Civil Hospital Karachi, and the mortuary, Dow Medical College, Dow University of Health Sciences, Karachi. Adult male prisoners dying in the Karachi central prison and referred to the study setting for determination of the cause of death for medico-legal purposes were included. Female prisoners, and cases where the final report of the cause of death was not available, were excluded. CT scans of the vital body regions (head, neck, thorax, abdomen and pelvis) were carried out in all cases. The scans were read and reported by two radiologists. Anatomical dissection-based autopsy was carried out by the forensic expert. The final report regarding the cause of death was issued by the forensic expert based on the combined findings, histopathology, toxicology results and circumstantial evidence. The CT scan and autopsy findings were compared, and agreement was assessed using percentage agreement and the kappa statistic. Results: There were 14 cases in all, with a mean age of 41.2 ± 17 years. The alleged mode of death was custodial torture in all cases. CT determined the cause of death to be natural cardio-respiratory failure in 10 cases, strangulation in 1, pulmonary tuberculosis (TB) in 2, and trauma to the spine in 1. Autopsy determined natural death in 11 cases, pulmonary TB in 2, and asphyxia in 1. Agreement between CT and autopsy was 92% (k = 0.92), and between CT and the finalized cause of death 100% (k = 1.0). Conclusion: PMCT is as effective as dissection autopsy in identifying pulmonary infections and natural causes of death. It is more effective in identifying vertebral fractures, which may exclude hanging and corroborate trauma to the spine
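The kappa values reported above come from Cohen's kappa, which corrects raw percentage agreement for the agreement expected by chance. A minimal sketch of the statistic (the ratings below are illustrative, not the study's tabulation):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same cases:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
    and p_e is the chance agreement implied by each rater's marginals."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)
```

For identical ratings kappa is 1.0; when observed agreement equals chance agreement, kappa is 0, which is why it is a stricter measure than raw percentage agreement.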

    Transcription factor AP-1 in esophageal squamous cell carcinoma: Alterations in activity and expression during Human Papillomavirus infection

    Background: Esophageal squamous cell carcinoma (ESCC) is a leading cause of cancer-related deaths in the Jammu and Kashmir (J&K) region of India. A substantial proportion of esophageal carcinomas is associated with infection by the high-risk types HPV16 and HPV18, whose oncogenic expression is controlled by the host cell transcription factor Activator Protein-1 (AP-1). We therefore investigated the DNA binding and expression pattern of AP-1 in esophageal cancer with or without HPV infection. Methods: Seventy-five histopathologically confirmed esophageal cancer biopsies and an equal number of corresponding adjacent normal tissue biopsies from Kashmir were analyzed for HPV infection, DNA binding activity and expression of the AP-1 family of proteins by PCR, gel shift assay and immunoblotting, respectively. Results: High DNA binding activity and elevated expression of AP-1 proteins were observed in esophageal cancer, differing between HPV-positive (19%) and HPV-negative (81%) carcinomas. While JunB, c-Fos and Fra-1 were the major contributors to AP-1 binding activity in HPV-negative cases, Fra-1 was completely absent in HPV16-positive cancers. Comparison of AP-1 family proteins demonstrated high expression of JunD and c-Fos in HPV-positive tumors, but, interestingly, Fra-1 expression was extremely low or absent in these tumor tissues. Conclusion: Differential AP-1 binding activity and expression of its specific proteins between HPV-positive and HPV-negative cases indicate that AP-1 may play an important role during HPV-induced esophageal carcinogenesis

    Large-scale unit commitment under uncertainty: an updated literature survey

    The Unit Commitment problem in energy management aims at finding the optimal production schedule of a set of generation units, while meeting various system-wide constraints. It has always been a large-scale, non-convex, difficult problem, especially in view of the fact that, due to operational requirements, it has to be solved in an unreasonably small time for its size. Recently, growing renewable energy shares have strongly increased the level of uncertainty in the system, making the (ideal) Unit Commitment model a large-scale, non-convex and uncertain (stochastic, robust, chance-constrained) program. We provide a survey of the literature on methods for the Uncertain Unit Commitment problem, in all its variants. We start with a review of the main contributions on solution methods for the deterministic versions of the problem, focussing on those based on mathematical programming techniques that are more relevant for the uncertain versions of the problem. We then present and categorize the approaches to the latter, while providing entry points to the relevant literature on optimization under uncertainty. This is an updated version of the paper "Large-scale Unit Commitment under uncertainty: a literature survey" that appeared in 4OR 13(2), 115–171 (2015); this version has over 170 more citations, most of which appeared in the last three years, attesting to how fast the literature on uncertain Unit Commitment evolves, and to the continued interest in this subject
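As a sketch of the deterministic core underlying all these variants, a heavily simplified Unit Commitment model (omitting ramping limits, minimum up/down times, reserves, and network constraints, which real formulations include) can be written as:

```latex
\min_{u,\,p} \; \sum_{t=1}^{T} \sum_{i=1}^{n}
    \bigl( c_i(p_i^t) + f_i\, u_i^t \bigr)
\quad \text{s.t.} \quad
\sum_{i=1}^{n} p_i^t = d^t \;\; \forall t, \qquad
u_i^t\, \underline{p}_i \;\le\; p_i^t \;\le\; u_i^t\, \overline{p}_i
    \;\; \forall i, t, \qquad
u_i^t \in \{0,1\},
```

where $u_i^t$ is the on/off commitment of unit $i$ in period $t$, $p_i^t$ its production, $c_i(\cdot)$ its (typically convex) generation cost, $f_i$ its fixed commitment cost, and $d^t$ the demand. The binary commitment variables are what make the problem non-convex and large-scale; the uncertain variants surveyed above replace the known $d^t$ (and renewable output) with stochastic, robust, or chance-constrained counterparts.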

    Refining and Reasoning about Nonfunctional Requirements

    Nonfunctional requirements (NFRs) must be addressed early in the software development cycle to avoid the cost of revisiting those requirements, or of refactoring, at later stages of the development cycle. Methods and frameworks that identify and incorporate NFRs at each stage of the development cycle reduce this cost. The methodology used in this work for refining and reasoning about NFRs is based on the NFR framework. This work identifies four NFR types and provides a methodology for developing domain-specific NFRs, using techniques for converting the requirements into design artifacts per NFR type. The contribution is the four NFR types (Functionally Restrictive, Additive Restrictive, Policy Restrictive, and Architecture Restrictive) together with a software engineering process that provides specific refinements resulting in unique architectural and design artifacts. Applying the same functional-requirement focus to the different NFR domains enhances the development process and promotes software quality attributes such as composability, maintainability, evolvability, and traceability

    Using Aspects for Testing Nonfunctional Requirements in Object-Oriented Systems

    Software testing is one of the most time-consuming activities in the software development cycle. Current research suggests that aspect-oriented programming (AOP) can enhance testing and has the potential to be more effective than macros or test interfaces. There are two major weaknesses in using aspects: the inability of aspect code to be woven at all execution points, and the lack of direct support for interweaving aspects with other aspects. In this paper we address these two weaknesses and provide a means to overcome them. In addition, current research has focused only on using aspects to test functional requirements (FRs) and has paid little attention to nonfunctional requirements (NFRs). We therefore perform a feasibility study of using aspects to test NFRs, based on two categorizations of NFRs. The first categorization splits NFRs into four types: functionally restrictive, additive restrictive, policy restrictive, and architecturally restrictive; the second splits NFRs into two types: operational and nonoperational. These categorizations can serve as a starting point for developing frameworks or methodologies for testing NFRs with aspects
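In AspectJ-style AOP, test advice is woven at join points in the code under test. A rough Python analogue of an aspect that checks an operational NFR can be sketched with a decorator; the response-time budget used here is purely illustrative and is not a construct from the paper:

```python
import functools
import time

def response_time_aspect(budget_s):
    """Aspect-like advice: wrap a function (the join point) and fail
    the NFR check if the call exceeds its latency budget."""
    def aspect(func):
        @functools.wraps(func)
        def advice(*args, **kwargs):
            start = time.perf_counter()
            result = func(*args, **kwargs)   # proceed to the join point
            elapsed = time.perf_counter() - start
            if elapsed > budget_s:
                raise AssertionError(
                    f"{func.__name__} violated NFR: "
                    f"{elapsed:.4f}s > {budget_s}s")
            return result
        return advice
    return aspect

@response_time_aspect(budget_s=1.0)
def lookup(key, table):
    """Business logic under test; the NFR check is woven in
    without modifying this function's body."""
    return table.get(key)
```

The appeal over macros or test interfaces is the same as in the paper's setting: the NFR check is attached non-invasively and can be removed for production builds without touching the code under test.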

    Removal of Heavy Metals from Drinking Water by Magnetic Carbon Nanostructures Prepared from Biomass

    Heavy metal contamination of drinking water has significant adverse effects on human health due to the metals' toxic nature. In this study a new adsorbent, magnetic graphitic nanostructures, was prepared from watermelon waste. The adsorbent was characterized by different instrumental techniques (surface area analyzer, FTIR, XRD, EDX, SEM, and TG/DTA) and was used for the removal of heavy metals (As, Cr, Cu, Pb, and Zn) from water. The adsorption parameters for heavy metal adsorption were determined using the Freundlich and Langmuir isotherms. The adsorption kinetics and the effects of time, pH, and temperature on heavy metal ions were also determined. The best fits were obtained with the Freundlich isotherm. The percent adsorption declined at high pH. The kinetics experiments were best fitted by a second-order kinetics model. The values of ΔH° and ΔG° were negative, while that of ΔS° was positive. The prepared adsorbent has high adsorption capacities and can be used efficiently for the removal of heavy metals from water
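Fitting the Freundlich isotherm q = K_F · C^(1/n) is commonly done on its linearized form, log q = log K_F + (1/n) log C, by ordinary least squares. A minimal sketch with hypothetical equilibrium data (not the paper's measurements):

```python
import math

def fit_freundlich(C, q):
    """Least-squares fit of log q = log K_F + (1/n) log C.
    C: equilibrium concentrations, q: equilibrium uptakes.
    Returns (K_F, n)."""
    xs = [math.log10(c) for c in C]
    ys = [math.log10(v) for v in q]
    m = len(xs)
    x_bar = sum(xs) / m
    y_bar = sum(ys) / m
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
             / sum((x - x_bar) ** 2 for x in xs))
    intercept = y_bar - slope * x_bar
    return 10 ** intercept, 1 / slope  # K_F from intercept, n from slope

# Hypothetical data generated from K_F = 2, n = 2, i.e. q = 2 * C**0.5
C = [1.0, 4.0, 9.0, 16.0]
q = [2.0 * c ** 0.5 for c in C]
K_F, n = fit_freundlich(C, q)
```

On real data the goodness of this linear fit (versus the linearized Langmuir form, C/q against C) is what decides which isotherm describes the adsorbent best, as reported above.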

    Risk factors and quality of life of dyslipidemic patients in Lebanon: A cross-sectional study

    The main objective of this study was to identify the risk factors of dyslipidemia and measure its impact on patients’ quality of life (QOL). Secondary objectives were to determine the percentage of dyslipidemia and assess the predictive factors affecting patients’ QOL. A cross-sectional study was conducted in a sample of the Lebanese population. A standardized questionnaire was developed to assess QOL using the Short Form-36 (SF-36) score. A total of 452 individuals were interviewed, of whom 59.5% were female. The mean age was 43.3 ± 15.6 years, and 24.8% had dyslipidemia. The results showed a lower overall QOL score among dyslipidemic patients compared with controls (57.9% and 76.5%, respectively; p < 0.001). Waterpipe smoking [adjusted odds ratio (ORa) = 4.113, 95% confidence interval (CI): 1.696–9.971, p = 0.002], hypertension (ORa = 3.597, 95% CI: 1.818–7.116, p < 0.001), diabetes (ORa = 3.441, 95% CI: 1.587–7.462, p = 0.002), cigarette smoking (ORa = 2.966, 95% CI: 1.516–5.804, p = 0.001), and passive smoking (ORa = 2.716, 95% CI: 1.376–5.358, p = 0.004) were significantly associated with dyslipidemia in individuals older than 30 years. A higher overall QOL score (p = 0.013) was observed in patients treated with statins in comparison with other lipid-lowering medications. In addition to its clinical and economic consequences, dyslipidemia may have a significant impact on patients’ QOL. Further research is needed to confirm the impact of treatment on dyslipidemic patients’ QOL in order to maximize the overall benefits of therapy
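The adjusted odds ratios above come from multivariable modeling. As a simpler illustration of the statistic itself, an unadjusted odds ratio with a Woolf (log-normal) 95% confidence interval can be computed from a 2×2 exposure table; the counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 table
    (a = exposed cases, b = exposed controls,
     c = unexposed cases, d = unexposed controls),
    with a Woolf 95% confidence interval on the log scale."""
    odds_ratio = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * se_log)
    upper = math.exp(math.log(odds_ratio) + z * se_log)
    return odds_ratio, lower, upper

# Hypothetical counts purely for illustration
or_value, lower, upper = odds_ratio_ci(30, 70, 20, 120)
```

An adjusted OR, as reported in the study, additionally controls for confounders (age, sex, other exposures) via logistic regression, which is why it can differ from the raw 2×2 ratio.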

    Polysaccharides and lignin based hydrogels with potential pharmaceutical use as a drug delivery system produced by a reactive extrusion process

    Currently, there is very strong interest in replacing synthetic polymers with biological macromolecules from natural sources for applications that interact with humans or the environment. This research describes the development of drug delivery hydrogels from the natural polymers starch, lignin and hemicelluloses by means of reactive extrusion. The hydrogels show a strong, pH-dependent swelling ability, which may be used to control the diffusion rates of water and small molecules into and out of the gel. The hydrogels' degradation rates were also studied in a physiological solution (pH 7.4) for 15 days. The results indicated that, for all three macromolecules, lower molecular weight and a higher level of plasticizer both increase the rate of weight loss of the hydrogels. Degradation was greatly reduced when the polymers were extruded in the presence of a catalyst. Finally, dynamic mechanical analysis revealed that degradation of the hydrogels induces a significant reduction in the compressive modulus. This study demonstrates the characteristics and potential of natural polymers as a drug release system